Does Software that Explains Itself Really Help?

2022-04-09

Scientists who make artificial intelligence (AI) systems say they have no problem designing ones that make good predictions for business decisions. But they are finding that the AI may need to explain itself through another algorithm to make such tools effective for the people who use them.

AI is an area of computer science which aims to give machines abilities that seem like human intelligence.

"Explainable AI," or XAI, is a new field that has received a lot of investment. Small, new companies and large technology companies are competing to make complex software more understandable. Government officials in the United States and European Union also want to make sure machines' decision-making is fair and understandable.

Experts say that AI technology can sometimes increase unfair opinions about race, gender and culture in society. Some AI scientists think explanations are an important way to deal with that.

Over the last two years, U.S. government agencies including the Federal Trade Commission have warned that AI that is not explainable could be investigated. The European Union could also pass the Artificial Intelligence Act next year. That law would require explanations of AI results.

Supporters of explainable AI say it has helped increase the effectiveness of AI's use in fields like healthcare and sales.

For example, Microsoft's LinkedIn professional networking service earned 8 percent more money after giving its sales team AI software. The software aims to predict the risk of a person canceling a subscription. But it also explains why it makes each prediction.
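The kind of prediction-plus-explanation described above can be pictured with a small sketch. This is not LinkedIn's CrystalCandle; it is a hypothetical churn scorer whose weights and feature names are invented for illustration. Its "explanation" is simply each feature's contribution to a linear score, ranked from largest to smallest.

```python
import math

# Invented weights for a toy linear churn model (not a real system).
WEIGHTS = {
    "months_since_last_login": 0.9,
    "support_tickets_open": 0.6,
    "seats_used_ratio": -1.2,   # heavy usage lowers churn risk
}
BIAS = -0.5

def churn_risk(features):
    """Return (probability, per-feature contributions) for one account."""
    contributions = {name: WEIGHTS[name] * value
                     for name, value in features.items()}
    score = BIAS + sum(contributions.values())
    probability = 1 / (1 + math.exp(-score))   # logistic link
    return probability, contributions

def explain(contributions, top=2):
    """The 'explanation': features ranked by how much they push risk up."""
    return sorted(contributions.items(), key=lambda kv: -kv[1])[:top]

account = {"months_since_last_login": 2.0,
           "support_tickets_open": 3.0,
           "seats_used_ratio": 0.4}
prob, contribs = churn_risk(account)
```

A salesperson would see both the risk number and the top-ranked reasons, which is the pattern the article attributes to the LinkedIn tool.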
The system was launched last July. It is expected to be described on LinkedIn's website.

But critics say explanations of AI predictions are not trustworthy. They say the AI technology to explain the machines' results is not good enough.

Developers of explainable AI say that each step in the process should be improved. These steps include analyzing predictions, creating explanations, confirming them and making them helpful for users.
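The four steps listed above can be sketched as four tiny functions. This is a toy illustration built on an invented linear scorer, not any real XAI pipeline; the "confirmation" step here is a simple perturbation check that the cited feature really moves the score.

```python
# Invented weights for a toy linear risk scorer.
WEIGHTS = {"logins": -0.8, "complaints": 1.1, "tenure_years": -0.3}

def predict(features):                       # step 1: analyze a prediction
    return sum(WEIGHTS[k] * v for k, v in features.items())

def explain(features):                       # step 2: create an explanation
    """Rank features by how much each one raises the score."""
    return sorted(features, key=lambda k: -(WEIGHTS[k] * features[k]))

def confirm(features, feature):              # step 3: confirm the explanation
    """Check the cited feature actually changes the score when zeroed out."""
    reduced = dict(features, **{feature: 0.0})
    return abs(predict(features) - predict(reduced)) > 1e-9

def report(features):                        # step 4: make it helpful to users
    top = explain(features)[0]
    assert confirm(features, top)
    return f"Risk score {predict(features):.2f}; main driver: {top}"
```

Real systems replace each step with far more sophisticated machinery, but the division of labor is the same.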
But after two years, LinkedIn said its technology has already created value. It said the proof is the 8 percent increase in money from subscription sales during the current financial year.

Before the AI software, LinkedIn salespeople used their own abilities. Now, the AI quickly does research and analysis. Called CrystalCandle by LinkedIn, it identifies actions and helps salespeople sell subscriptions and other services.

LinkedIn said the explanation-based service has extended to more than 5,000 sales employees. It includes finding new workers, advertising, marketing and educational offerings.

"It has helped experienced salespeople by arming them with specific insights," said Parvez Ahammad. He is LinkedIn's director of machine learning and head of data science applied research.

But some AI experts question whether explanations are needed. They say explanations could even do harm, creating a false idea of security in AI. Researchers say they could also create design changes that are less useful.

But LinkedIn said an algorithm's strength cannot be understood without understanding its "thinking."

LinkedIn also said that tools like its CrystalCandle could help AI users in other fields. Doctors could learn why AI predicts that someone is more at risk of a disease. People could be told why AI recommended that they be denied a credit card.

Been Kim is an AI researcher at Google. She hopes that explanations show whether a system presents ideas and values people want to support. She said explanations can create a kind of discussion between machines and humans.

"If we truly want to enable human-machine collaboration, we need that," Kim said.

I'm Dan Novak.
Dan Novak adapted this story for VOA Learning English based on reporting from Reuters.

_______________________________________________________________

Words in This Story

algorithm - n. a set of steps that are followed in order to solve a mathematical problem or to complete a computer process

subscription - n. an agreement that you make with a company to get a publication or service regularly and that you pay for regularly

gender - n. the state of being male or female

analyze - v. to study something closely and carefully; to learn the nature and relationship of the parts of something by a close and careful examination

insight - n. an understanding of the true nature of something

collaboration - n. the act of working with another person or group in order to gain or do something